    Visual Estimation of Fingertip Pressure on Diverse Surfaces using Easily Captured Data

    People often use their hands to make contact with the world and apply pressure. Machine perception of this important human activity could be widely applied. Prior research has shown that deep models can estimate hand pressure based on a single RGB image. Yet, evaluations have been limited to controlled settings, since performance relies on training data with high-resolution pressure measurements that are difficult to obtain. We present a novel approach that enables diverse data to be captured with only an RGB camera and a cooperative participant. Our key insight is that people can be prompted to perform actions that correspond with categorical labels describing contact pressure (contact labels), and that the resulting weakly labeled data can be used to train models that perform well under varied conditions. We demonstrate the effectiveness of our approach by training on a novel dataset with 51 participants making fingertip contact with instrumented and uninstrumented objects. Our network, ContactLabelNet, dramatically outperforms prior work, performs well under diverse conditions, and matches or exceeds the performance of human annotators.
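
    To make the weak-supervision idea concrete, below is a minimal sketch of how a network could be trained from RGB frames paired only with prompted categorical contact labels. The class set, backbone, and hyperparameters are assumptions for illustration, not the paper's actual ContactLabelNet architecture or training recipe.

    ```python
    import torch
    import torch.nn as nn
    import torchvision.models as models

    # Assumed label set; the paper's exact contact-label categories may differ.
    NUM_CONTACT_CLASSES = 3  # e.g. {no_contact, low_pressure, high_pressure}

    class ContactClassifier(nn.Module):
        """Hypothetical model: RGB image -> logits over contact-pressure categories."""
        def __init__(self, num_classes: int = NUM_CONTACT_CLASSES):
            super().__init__()
            backbone = models.resnet18(weights=None)  # assumed backbone choice
            backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
            self.net = backbone

        def forward(self, rgb: torch.Tensor) -> torch.Tensor:
            return self.net(rgb)

    model = ContactClassifier()
    # Weak supervision: each frame carries only the categorical label the participant
    # was prompted to produce, so a standard cross-entropy over contact classes applies.
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    images = torch.randn(8, 3, 224, 224)  # dummy batch of RGB frames
    contact_labels = torch.randint(0, NUM_CONTACT_CLASSES, (8,))  # prompted labels

    logits = model(images)
    loss = criterion(logits, contact_labels)
    loss.backward()
    optimizer.step()
    ```

    The key property is that no high-resolution pressure sensor output appears anywhere in the loss; the only supervision is the coarse label implied by the prompted action, which is what makes data collection possible with just a camera and a cooperative participant.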